Boosting and hard-core set constructions: a simplified approach

Author

  • Satyen Kale
Abstract

We revisit the connection between boosting algorithms and hard-core set constructions discovered by Klivans and Servedio. We present a boosting algorithm with a smoothness property that is necessary for hard-core set constructions: the distributions it generates do not put too much weight on any single example. We then use this boosting algorithm to show the existence of hard-core sets matching the best parameters of Klivans and Servedio's construction.

1 Boosting and Hard-Core Sets

Hard-core set constructions are a form of hardness amplification of boolean functions. In such constructions, starting with a function that is mildly inapproximable by circuits of a certain size, we obtain a distribution on inputs under which the same function is highly inapproximable by circuits of size closely related to the original. Impagliazzo [3] gave the first hard-core set constructions and used them to give an alternative proof of Yao's XOR lemma. Klivans and Servedio [4] gave improved hard-core set constructions using a connection to the notion of boosting from learning theory. The goal of boosting is to "boost" the small initial advantage over random guessing that a weak learner can achieve in Valiant's PAC (Probably Approximately Correct) model of learning. Klivans and Servedio describe how boosting algorithms that satisfy certain smoothness properties can be used generically to obtain good hard-core set constructions.

We give a new boosting algorithm that enjoys the smoothness properties necessary for hard-core set constructions. It has the additional desirable property that it runs for the same number of iterations as the AdaBoost algorithm, and is therefore more efficient than Servedio's SmoothBoost algorithm [5]. For this reason, the boosting algorithm should be of independent interest, especially for the applications in Servedio's paper [5] on learning in the presence of malicious noise.

Our boosting algorithm is inspired by Warmuth and Kuzmin's [7] technique (which, in turn, uses ideas that originated in the work of Herbster and Warmuth [2]) of obtaining a smooth distribution from any other distribution by projecting it onto the set of smooth distributions, using the relative entropy as the distance function. We use this boosting algorithm to construct hard-core sets matching the best parameters of the Klivans-Servedio construction. Although we do not improve on the parameters of Klivans and Servedio's construction, our proof is arguably simpler for three reasons: (a) it obtains the best known parameters for hard-core set constructions directly by applying the boosting algorithm, rather than building them up incrementally by accumulating small hard-core sets, and (b) it obviates ...

Electronic Colloquium on Computational Complexity, Report No. 131 (2007)
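To make the smoothness idea concrete, here is an illustrative Python sketch, not the paper's exact algorithm: an AdaBoost-style multiplicative-weights update whose distribution is, after every round, projected (in relative entropy) back onto the set of smooth distributions, i.e. those in which no example carries more than 1/(εn) of the mass. The function names, the 1-D stump weak learner, and the numerical clamping are our own illustrative choices.

```python
import math

def project_to_capped_simplex(p, c):
    """KL-project a distribution p onto {q : q_i <= c, sum_i q_i = 1}.

    The relative-entropy projection caps the largest coordinates at c
    and rescales the remaining mass proportionally (Herbster-Warmuth
    style capping)."""
    n = len(p)
    assert c * n >= 1.0, "cap must leave room for a valid distribution"
    order = sorted(range(n), key=lambda i: p[i], reverse=True)
    q = list(p)
    tail = sum(p)                   # mass of the not-yet-capped coordinates
    for k, i in enumerate(order):
        scale = (1.0 - k * c) / tail
        if p[i] * scale <= c:       # largest uncapped coordinate fits
            for j in order[k:]:
                q[j] = p[j] * scale
            break
        q[i] = c                    # cap it and redistribute the rest
        tail -= p[i]
    return q

def stump(xs, ys, d):
    """Weak learner: best weighted threshold stump on 1-D data (illustrative)."""
    best, best_err = None, 2.0
    for t in set(xs):
        for s in (1, -1):
            h = lambda x, t=t, s=s: s if x >= t else -s
            err = sum(d[i] for i in range(len(xs)) if h(xs[i]) != ys[i])
            if err < best_err:
                best, best_err = h, err
    return best

def smooth_boost(xs, ys, weak_learner, rounds, eps):
    """AdaBoost-style boosting kept eps-smooth: after each multiplicative
    update the distribution is KL-projected back onto
    {q : q_i <= 1/(eps*n)}, so no single example is over-weighted."""
    n = len(xs)
    cap = 1.0 / (eps * n)
    d = [1.0 / n] * n               # start from the uniform distribution
    hyps, alphas = [], []
    for _ in range(rounds):
        h = weak_learner(xs, ys, d)
        err = sum(d[i] for i in range(n) if h(xs[i]) != ys[i])
        err = min(max(err, 1e-9), 0.5 - 1e-9)   # clamp for numerical safety
        a = 0.5 * math.log((1.0 - err) / err)   # AdaBoost step size
        d = [d[i] * math.exp(-a * ys[i] * h(xs[i])) for i in range(n)]
        z = sum(d)
        d = project_to_capped_simplex([w / z for w in d], cap)
        hyps.append(h)
        alphas.append(a)
    # final hypothesis: sign of the weighted vote
    return lambda x: 1 if sum(a * h(x) for a, h in zip(alphas, hyps)) >= 0 else -1
```

The projection step is the point of contact with hard-core sets: because every intermediate distribution is ε-smooth, it can be scaled into a measure of large density, which is what a hard-core set construction needs.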

Similar articles

Distribution-Specific Agnostic Boosting

We consider the problem of boosting the accuracy of weak learning algorithms in the agnostic learning framework of Haussler (1992) and Kearns et al. (1992). Known algorithms for this problem (Ben-David et al., 2001; Gavinsky, 2002; Kalai et al., 2008) follow the same strategy as boosting algorithms in the PAC model: the weak learner is executed on the same target function but over different dis...


Boosting and Hard-Core Sets

This paper connects two fundamental ideas from theoretical computer science: hard-core set construction, a type of hardness amplification from computational complexity, and boosting, a technique from computational learning theory. Using this connection we give fruitful applications of complexity-theoretic techniques to learning theory and vice versa. We show that the hard-core set construction ...


Investigating the Evolution of the Political Theory of Imam Khomeini (R.A.) with the Lakatos Methodology Approach

In this article, Imam Khomeini's political theory is investigated with the Lakatos methodology. This methodology distinguishes between a hard core, the original skeleton of a research program, and experimental propositions that serve as its protective belt. In this regard, the claim of this article is that if we understand the system of religious knowledge of Imam Kho...


Multiple Classifier Combination through Ensembles and Data Generation

An ensemble of classifiers consists of a set of individually trained classifiers whose predictions are combined when classifying new instances. The resulting ensemble is generally more accurate than the individual classifiers it consists of. In particular, one of the most popular ensemble methods, the Boosting approach, improves the predictive performance of weak classifiers, which can achieve ...


The uniform hardcore lemma via approximate Bregman projections

We give a simple, more efficient and uniform proof of the hard-core lemma, a fundamental result in complexity theory with applications in machine learning and cryptography. Our result follows from the connection between boosting algorithms and hard-core set constructions discovered by Klivans and Servedio [KS03]. Informally stated, our result is the following: suppose we fix a family of boolean...



Journal:
  • Electronic Colloquium on Computational Complexity (ECCC)

Volume 14, Issue -

Pages -

Publication date: 2007